Subtle Facial Expression Synthesis using Motion Manifold Embedding and Nonlinear Decomposable Generative Models

Authors

  • Chan-Su Lee
  • Yang Wang
  • Zhiguo Li
  • Atul Kanaujia
  • Ahmed Elgammal
  • Dimitris Samaras
  • Dimitris Metaxas
  • Xiangfeng Gu
  • Peisen Huang
Abstract

Facial motions convey personal characteristics and subtle emotional states. This paper presents a new framework for modeling the facial motions of different people and multiple expression types from high-resolution facial expression tracking data. We also provide a mechanism for animating subtle facial expressions based on video sequences. A conceptual motion manifold gives a unified representation of facial motion dynamics. Subtle local motions in facial expressions are modeled by a nonlinear mapping, using an empirical kernel map, from the embedding manifold. We represent facial expressions of different people, as well as different expression types, by a nonlinear decomposable generative model obtained through multilinear analysis of the coefficient space of these nonlinear mappings. High-resolution facial motions can then be synthesized from facial motion tracked in video sequences by estimating the model parameters.
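To make the pipeline in the abstract concrete, the following is a minimal NumPy sketch (not the authors' implementation) of the two core steps: mapping a conceptual one-dimensional embedding manifold to high-dimensional facial motion through a Gaussian-RBF empirical kernel map, and applying a multilinear (HOSVD-style) analysis to the resulting mapping coefficients. The unit-circle embedding, the RBF centers and width, the synthetic stand-in data, and the helper names psi and fit_mapping are all assumptions made for illustration.

# Minimal sketch (not the authors' code) of the pipeline described in the abstract.
# Assumed/hypothetical pieces: unit-circle embedding, Gaussian RBF centers, toy data shapes.
import numpy as np

rng = np.random.default_rng(0)

T, D = 40, 90             # frames per expression cycle, dimension of a facial motion vector
n_people, n_expr = 3, 2   # style factors: person identity and expression type
centers = np.linspace(0.0, 2 * np.pi, 8, endpoint=False)  # RBF centers on the circle


def psi(theta, width=0.5):
    """Empirical kernel map: Gaussian RBFs evaluated on the unit-circle embedding."""
    theta = np.atleast_1d(theta)
    # wrapped angular distance between each frame's phase and each kernel center
    d = np.angle(np.exp(1j * (theta[:, None] - centers[None, :])))
    return np.exp(-d ** 2 / (2 * width ** 2))            # (T, n_centers)


def fit_mapping(Y, theta):
    """Least-squares RBF coefficients C such that Y ~= psi(theta) @ C.T."""
    Psi = psi(theta)                                      # (T, n_centers)
    C, *_ = np.linalg.lstsq(Psi, Y, rcond=None)           # (n_centers, D)
    return C.T                                            # (D, n_centers)


# Toy data: one motion sequence per (person, expression) pair.
theta = np.linspace(0.0, 2 * np.pi, T, endpoint=False)    # conceptual motion manifold
coeffs = np.zeros((n_people, n_expr, D * len(centers)))
for p in range(n_people):
    for e in range(n_expr):
        Y = rng.normal(size=(T, D))                       # stand-in for tracked facial motion
        coeffs[p, e] = fit_mapping(Y, theta).ravel()      # mapping coefficients per style pair

# Multilinear (HOSVD-style) analysis of the coefficient space: unfold the coefficient
# tensor along the people mode and the expression mode, then take SVDs; the left
# singular vectors give person-style and expression-style bases.
people_mode = coeffs.reshape(n_people, -1)                  # mode-1 unfolding
expr_mode = coeffs.transpose(1, 0, 2).reshape(n_expr, -1)   # mode-2 unfolding
U_people, _, _ = np.linalg.svd(people_mode, full_matrices=False)
U_expr, _, _ = np.linalg.svd(expr_mode, full_matrices=False)

print("person-style basis:", U_people.shape, "expression-style basis:", U_expr.shape)

In this setup, a new frame would be synthesized as y_t = C_style psi(theta_t), where C_style is reassembled from a person-style vector and an expression-style vector estimated from a tracked video sequence; the details of that estimation are in the full paper.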


Similar articles

Homeomorphic Manifold Analysis: Learning Decomposable Generative Models for Human Motion Analysis

If we consider the appearance of human motion such as gait, facial expression and gesturing, most such activities result in nonlinear manifolds in the image space. Although the intrinsic body configuration manifolds might be very low in dimensionality, the resulting appearance manifold is challenging to model given the various aspects that affect the appearance, such as the view point, the perso...


Facial Expression Analysis Using Nonlinear Decomposable Generative Models

We present a new framework to represent and analyze dynamic facial motions using a decomposable generative model. In this paper, we consider facial expressions that lie on a one-dimensional closed manifold, i.e., starting from some configuration and coming back to the same configuration, while there are other sources of variability such as different classes of expression, and different people, et...



Style Adaptive Bayesian Tracking Using Explicit Manifold Learning

Characteristics of the 2D contour shape deformation in human motion contain rich information and can be useful for human identification, gender classification, 3D pose reconstruction and so on. In this paper we introduce a new approach to contour tracking for human motion using an explicit model of the motion manifold and a learned decomposable generative model. We use nonlinear dimensiona...


Modeling Human Motion Using Manifold Learning and Factorized Generative Models

Abstract of the dissertation "Modeling Human Motion Using Manifold Learning and Factorized Generative Models" by Chan-Su Lee; Dissertation Director: Ahmed Elgammal. Modeling the dynamic shape and appearance of articulated moving objects is essential for human motion analysis, tracking, synthesis, and other computer vision problems. Modeling the shape and appearance of human motion is challenging due to the ...




Journal: —

Volume / Issue: —

Pages: —

Publication date: 2006